14 research outputs found

    Cross-layer scheduling and resource allocation for heterogeneous traffic in 3G LTE

    3G Long Term Evolution (LTE) imposes stringent requirements for delivering different kinds of traffic with the appropriate Quality of Service (QoS). A major difficulty is that the LTE standard does not specify a scheduling algorithm to control the assignment of resources, which would in turn improve user satisfaction. This remains an open problem, and various scheduling algorithms of considerable complexity have been proposed. To address this issue, in this paper we investigate how our proposed algorithm improves user satisfaction for heterogeneous traffic, that is, best-effort traffic such as file transfer protocol (FTP) and real-time traffic such as voice over internet protocol (VoIP). Our algorithm is formulated using a cross-layer technique, and its goal is to maximize the expected total user satisfaction (total utility) under different constraints. We compared it with proportional fair (PF), exponential proportional fair (EXP-PF), and U-delay scheduling. In simulations, our algorithm improved the performance of real-time traffic on the throughput, VoIP delay, and VoIP packet loss ratio metrics, while PF performed better for best-effort traffic on the FTP traffic received, FTP packet loss ratio, and FTP throughput metrics.
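    As a rough illustration of the kind of per-user scheduling metrics being compared here, the following Python sketch contrasts a proportional fair (PF) metric with an EXP-PF-style delay-weighted metric; the variable names, numbers, and the exact exponential weighting are illustrative assumptions, not the paper's formulation.

```python
import math

def pf_metric(inst_rate, avg_throughput):
    """Proportional fair: instantaneous rate over historical average throughput."""
    return inst_rate / max(avg_throughput, 1e-9)

def exp_pf_metric(inst_rate, avg_throughput, hol_delay, delay_budget):
    """EXP-PF-style metric: PF weighted by an exponential head-of-line delay
    term. The exact weighting here is an illustrative assumption."""
    return math.exp(hol_delay / delay_budget) * pf_metric(inst_rate, avg_throughput)

# Pick the user with the highest metric for the next scheduling opportunity.
users = [
    {"id": 0, "rate": 2.0e6, "avg": 1.0e6, "delay": 0.04, "budget": 0.1},  # VoIP-like
    {"id": 1, "rate": 5.0e6, "avg": 4.0e6, "delay": 0.00, "budget": 1.0},  # FTP-like
]
winner = max(users, key=lambda u: exp_pf_metric(u["rate"], u["avg"], u["delay"], u["budget"]))
print("schedule user", winner["id"])
```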

    Optimizing VoIP in 3G LTE using cross-layered scheduling and resource allocation schemes

    No full text
    Wireless communication is a major part of the wireless industry that has grown very fast and captured the attention of many researchers. It has evolved through several generations, and the current 3G Long Term Evolution (LTE) is the focus of this project. LTE is an emerging and promising technology that aims to provide ubiquitous broadband Internet access and improved multimedia services. This is achieved by streamlining the system for packet services, since LTE is an all-Internet Protocol (IP) network. Because 3G LTE is a packet-based network, it brings improvements in the form of higher bit rates, lower latencies, and so on. However, several technical challenges arise when voice traffic is transmitted over an LTE network. Voice transmission over 3G LTE raises major concerns because voice traffic, like any other real-time traffic, is affected by end-to-end delay (latency), jitter, and packet loss, all of which adversely affect the Quality of Service (QoS). This has led to the development of various scheduling and resource allocation schemes aimed at improving the QoS of voice traffic transmitted over a 3G LTE network. This research studies how cross-layer scheduling and resource allocation techniques can improve the QoS of voice traffic over a 3G LTE network, and proposes a novel cross-layer scheduling and resource allocation algorithm to improve the performance gains and QoS of voice transmitted over such a network. The novelty of the proposed algorithm is that, in contrast to existing techniques, it casts the voice packet scheduling and resource allocation problem as a constrained optimization problem. This optimization problem is formulated using the transmission rate and channel state information at the physical layer, together with queuing state information such as queue length at the Medium Access Control (MAC) layer. An algorithmic implementation of the obtained solution is provided, and the research also investigates the performance, mobility, complexity, and fairness of the proposed cross-layer scheduling algorithm under different conditions, including VoIP delay and packet loss.
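    To make the cross-layer idea concrete, here is a minimal Python sketch of a greedy per-resource-block allocation that weights the achievable physical-layer rate by the MAC-layer queue length; the max-weight rule used (rate times queue length) is an assumed stand-in for the thesis's constrained-optimization objective, not its actual solution.

```python
def allocate(rates, queues, num_rbs):
    """Greedy cross-layer allocation sketch.

    rates[u][rb]: achievable rate of user u on resource block rb (from CSI
                  at the physical layer).
    queues[u]:    queue length (bits) of user u at the MAC layer.
    """
    alloc = {}
    for rb in range(num_rbs):
        # Max-weight style choice: favor users with good channels AND backlogs.
        u = max(range(len(queues)), key=lambda u: rates[u][rb] * queues[u])
        alloc[rb] = u
        queues[u] = max(0, queues[u] - rates[u][rb])  # drain the served queue
    return alloc

rates = [[3, 1, 2], [1, 4, 2]]   # two users, three resource blocks
queues = [5, 6]
print(allocate(rates, queues, num_rbs=3))
```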

    Evaluation of a new scheduling scheme for VoIP with mobility in 3G LTE

    No full text

    Complexity and fairness analysis of a new scheduling scheme for VoIP in 3G LTE

    No full text
    3G Long Term Evolution (LTE) is an emerging and promising technology that aims to provide ubiquitous broadband Internet access and improved multimedia services. This is achieved by streamlining the system for packet services, since LTE is an all-Internet Protocol network. Because 3G LTE is a packet-based network, it brings improvements in the form of higher bit rates, lower latencies, and a variety of service offerings. However, technical challenges arise when voice traffic is transmitted over an LTE network. This has become an active area of research, and various resource management schemes of considerable complexity have been developed. In this paper, we analyze the complexity and fairness of our proposed scheduling scheme for voice over internet protocol (VoIP) in 3G LTE, called the VoIP optimization scheduling algorithm, and compare it with other algorithms in the literature. The algorithm has second-order complexity in the number of users (based on channel-quality feedback and queue-length metrics) and linear complexity in the number of resource blocks. Simulation results also showed an improvement of approximately 10–20 percent in fairness and performance, measured by the fairness index and throughput.
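    The fairness index is not named in the abstract; assuming the widely used Jain's fairness index, it can be computed over per-user throughputs as in this short Python sketch.

```python
def jains_index(throughputs):
    """Jain's fairness index: 1.0 = perfectly fair, 1/n = maximally unfair."""
    n = len(throughputs)
    s = sum(throughputs)
    return s * s / (n * sum(x * x for x in throughputs)) if s else 0.0

print(jains_index([1.0, 1.0, 1.0, 1.0]))  # 1.0  (equal shares)
print(jains_index([4.0, 0.0, 0.0, 0.0]))  # 0.25 (one user takes everything)
```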

    A Multinomial DGA Classifier for Incipient Fault Detection in Oil-Impregnated Power Transformers

    No full text
    This study investigates machine-learning approaches for interpreting Dissolved Gas Analysis (DGA) data to detect incipient faults early in oil-impregnated transformers. Transformers are critical equipment in the transmission and distribution of electrical energy; the failure of a single unit disrupts a large number of consumers and suppresses economic activity in the vicinity. Power utility companies therefore give high priority to condition monitoring of critical assets. Dissolved gas analysis is a widely used technique for monitoring the condition of oil-immersed transformers. The interpretation of DGA data is, however, inconclusive as far as the determination of incipient faults is concerned and depends largely on the expertise of technical personnel. To obtain a coherent, accurate, and clear interpretation of DGA, this study proposes a novel multinomial classification model, named KosaNet, based on decision trees. Actual DGA data with 2912 entries was used to compare the performance of KosaNet against other algorithms with multiclass classification ability, namely the decision tree, k-NN, Random Forest, Naïve Bayes, and Gradient Boost. The results show that KosaNet demonstrated improved DGA classification ability, particularly when classifying multinomial data.
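    KosaNet itself is not reproduced here, but the baseline comparison described above follows a standard scikit-learn pattern; the sketch below uses synthetic data as a stand-in for the dissolved-gas features (e.g., H2, CH4, C2H4 concentrations), which are assumptions for illustration.

```python
# Compare the multiclass baselines named in the abstract on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.naive_bayes import GaussianNB

# 2912 samples mirrors the dataset size; 7 features and 5 fault classes
# are assumed placeholders for the real DGA data.
X, y = make_classification(n_samples=2912, n_features=7, n_informative=5,
                           n_classes=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "k-NN": KNeighborsClassifier(),
    "RandomForest": RandomForestClassifier(random_state=0),
    "NaiveBayes": GaussianNB(),
    "GradientBoost": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    print(name, round(model.fit(X_tr, y_tr).score(X_te, y_te), 3))
```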

    A new scheduling scheme for voice awareness in 3G LTE

    No full text

    Implementation of IoT Framework with Data Analysis Using Deep Learning Methods for Occupancy Prediction in a Building

    No full text
    Many countries worldwide face challenges in enforcing incidence prevention measures for building fires. The most critical issues are the localization, identification, and detection of room occupants. The Internet of Things (IoT), combined with machine learning, has been shown to increase the smartness of buildings by providing real-time data acquisition from sensors and actuators for prediction mechanisms. This paper proposes the implementation of an IoT framework that captures indoor environmental parameters as multivariate time-series occupancy data. The Long Short-Term Memory (LSTM) deep learning algorithm is applied to infer the presence of human beings. An experiment was conducted in an office room using multivariate time series as predictors in a regression forecasting problem. The results demonstrate that the developed system can acquire, process, and store environmental information. The collected information was applied to the LSTM algorithm and compared with other machine learning algorithms: Support Vector Machine, Naïve Bayes, and Multilayer Perceptron feed-forward network. Based on the parametric calibrations, LSTM performs best in the context of the proposed application.
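    A minimal Keras sketch of the LSTM regression setup described above is given below; the window length, feature count (e.g., CO2, temperature, humidity, light), layer sizes, and random data are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf

TIMESTEPS, FEATURES = 30, 4          # 30 past readings of 4 indoor sensors
X = np.random.rand(256, TIMESTEPS, FEATURES).astype("float32")
y = np.random.rand(256, 1).astype("float32")   # occupancy target (placeholder)

# One LSTM layer feeding a scalar regression head.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, FEATURES)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```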

    An Efficient LoRa-Enabled Smart Fault Detection and Monitoring Platform for the Power Distribution System Using Self-Powered IoT Devices

    No full text
    Transient instability and supply disturbances are common yet unwelcome phenomena in power distribution systems, particularly in sub-Saharan Africa. The growing demand for greater reliability and dependability in power delivery has aroused the interest of researchers and renewed the pursuit of advanced technological solutions for fault detection and location at medium- and low-voltage levels. The length of a distribution network typically ranges from hundreds to thousands of kilometers, so managing the network, including identifying faulty segments, is a significant recurrent challenge for power-system operators. With an ever-expanding distribution network and regulatory demands for service reliability, the challenge facing network operators is daunting. However, deploying IoT technologies in the energy distribution infrastructure would significantly accelerate the detection and location of faults, transforming the electricity delivery service into one that is responsive, communicative, and robust. This study proposes, designs, and implements a low-cost LoRaWAN-based IoT platform for monitoring distribution networks. The study was conducted in Nakuru County, Kenya, on an actual, active distribution network owned and managed by Kenya Power Company. Experimental results showed that a trigger is generated at the network-monitoring center within about 100 ms of the occurrence of a fault on the distribution network, enabling prompt commencement of repair work. Furthermore, practical evaluation showed that the platform is well suited to the context of developing countries, where budgetary constraints and cost prohibitions hinder the upgrade of the legacy grid into a fully fledged smart grid.
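    As a purely illustrative sketch of how such a platform might report a fault over a LoRaWAN uplink, the Python snippet below encodes and decodes a compact fault-alert payload; the field layout (node id, fault code, line current, timestamp) is an assumed example, not the paper's actual frame format.

```python
import struct
import time

# Hypothetical fault codes a self-powered sensing node might report.
FAULT_CODES = {0: "none", 1: "overcurrent", 2: "phase loss", 3: "earth fault"}

def encode_alert(node_id: int, fault_code: int, current_a: float) -> bytes:
    # 2-byte node id, 1-byte fault code, 4-byte float current, 4-byte Unix
    # timestamp: 11 bytes total, well within typical LoRa payload limits.
    return struct.pack(">HBfI", node_id, fault_code, current_a, int(time.time()))

def decode_alert(payload: bytes):
    node_id, code, current_a, ts = struct.unpack(">HBfI", payload)
    return {"node": node_id, "fault": FAULT_CODES.get(code, "unknown"),
            "current_A": round(current_a, 1), "ts": ts}

# Round-trip at the monitoring center.
print(decode_alert(encode_alert(node_id=17, fault_code=1, current_a=312.5)))
```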

    Battery-Powered RSU Running Time Monitoring and Prediction Using ML Model Based on Received Signal Strength and Data Transmission Frequency in V2I Applications

    No full text
    The Internet of Things (IoT), vehicle-to-infrastructure (V2I) communication, and intelligent roadside units (RSUs) are promising paradigms for improving road traffic safety. However, for RSUs to communicate with vehicles and transmit data to a remote location, they require sufficient power and good network quality. Recent advances in technology have improved lithium-ion battery capabilities, but complementary methodologies, including battery management systems (BMS), must be developed to provide early warning of a battery's state of health. In this paper, we evaluate the impact of the received signal strength indication (RSSI) and the current consumption at different transmission frequencies on a static battery-powered RSU that relies on the global system for mobile communications (GSM)/general packet radio service (GPRS). Machine learning (ML) models, namely Random Forest (RF) and Support Vector Machine (SVM), were trained and tested on the collected data and compared using the coefficient of determination (R2). The models predict the battery's current consumption from the RSSI at the location where the RSU is deployed and the frequency at which the RSU transmits data to the remote database. RF outperformed SVM in predicting current consumption, with R2 values of 98% and 94%, respectively. Accurately forecasting the battery health of RSUs is essential for assessing their dependability and running time. The primary duty of the BMS is to estimate the status of the battery and its dynamic operating limits; however, achieving an accurate and robust battery state-of-charge estimate remains a significant challenge. Such estimates can help road managers make timely decisions, such as replacing a battery before the RSU power source is drained. The proposed method can be deployed in other remote WSN and IoT-based applications.
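    The RF-versus-SVM regression comparison described above can be sketched with scikit-learn as below, scoring both models with the coefficient of determination (R2); the synthetic RSSI/transmission-frequency data and model hyperparameters are assumptions standing in for the collected field measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in data: weaker signal (more negative RSSI) and more
# frequent transmissions both raise current draw, plus noise.
rng = np.random.default_rng(0)
rssi = rng.uniform(-110, -60, 500)           # dBm
tx_freq = rng.uniform(1, 60, 500)            # transmissions per hour
current = 5 + 0.1 * (-rssi) + 0.8 * tx_freq + rng.normal(0, 2, 500)  # mA

X = np.column_stack([rssi, tx_freq])
X_tr, X_te, y_tr, y_te = train_test_split(X, current, random_state=0)

for name, model in [("RandomForest", RandomForestRegressor(random_state=0)),
                    ("SVM", SVR(kernel="rbf", C=10.0))]:
    model.fit(X_tr, y_tr)
    print(name, "R2 =", round(r2_score(y_te, model.predict(X_te)), 3))
```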